Steepest descent on real flag manifolds
Author
Abstract
Among the compact homogeneous spaces, a very distinguished subclass is formed by the (generalized) real flag manifolds, which by definition are the orbits of the isotropy representations of Riemannian symmetric spaces (s-orbits). This class contains most compact symmetric spaces (e.g. all hermitian ones), all classical flag manifolds over real, complex and quaternionic vector spaces, all adjoint orbits of compact Lie groups (generalized complex flag manifolds) and many others. They form the main examples for isoparametric submanifolds and their focal manifolds (so-called constant principal curvature manifolds); in fact, for most codimensions these are the only such spaces (cf. [7], [9], [8]). Any real flag manifold M enjoys two very peculiar geometric properties: it carries a transitive action of a noncompact Lie group G, and it is embedded in euclidean space as a taut submanifold, i.e. almost all height or coordinate functions are perfect Morse functions (at least for Z/2-coefficients). The aim of our paper is to link these two properties by the following theorem: the gradient flow of any height function is a one-parameter subgroup of G, where the gradient is defined with respect to a suitable homogeneous metric s on M; in the case where M is an adjoint orbit, s is a homogeneous Kähler metric. In other words, the lines of steepest descent for the height function (gradient flow lines) are obtained by applying a one-parameter subgroup of G. This is an elementary fact when M is a euclidean sphere and G its conformal group: the gradient of any height function is a conformal vector field. For adjoint orbits, the (generalized) complex flag manifolds, this fact was observed earlier by Guest and Ohnita [4]. Our more general result can be derived from their theorem since real flag manifolds are contained in complex flag manifolds.
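For the sphere case mentioned above, the claim can be made explicit by a short computation (a sketch in standard notation, not taken from the paper): writing $h_a(p)=\langle a,p\rangle$ for the height function of $S^n\subset\mathbb{R}^{n+1}$ in a unit direction $a$, the gradient with respect to the round metric is the tangential part of the constant field $a$,
\[
\operatorname{grad} h_a(p) \;=\; a-\langle a,p\rangle\,p,
\]
a conformal but non-Killing vector field. Its flow moves every point $p\neq -a$ along the meridian great circle towards the pole $a$, and it is a one-parameter subgroup of the Möbius group $\mathrm{Conf}(S^n)$, which plays the role of $G$ in this example.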
Similar articles
Convex functions on symmetric spaces and geometric invariant theory for weighted configurations on flag manifolds
3 Convex functions on symmetric spaces
3.1 Geometric preliminaries
3.1.1 Metric spaces with curvature bounds
3.1.2 Hadamard spaces
3.1.3 Symmetric spaces of noncompact type
3.1.4 Auxiliary results ...
Steepest descent method for quasiconvex minimization on Riemannian manifolds
This paper extends the full convergence of the steepest descent algorithm with a generalized Armijo search and a proximal regularization to solve quasiconvex minimization problems defined on complete Riemannian manifolds. Previous convergence results are obtained as particular cases of our approach, and some examples in non-Euclidean spaces are given.
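To fix notation, the basic iteration behind such methods can be written as follows (a generic sketch of Riemannian steepest descent with Armijo backtracking, stated with the exponential map $\exp_x$ and parameters $\sigma\in(0,1)$, $\beta\in(0,1)$; the paper's generalized search and proximal regularization are not reproduced here):
\[
x_{k+1}=\exp_{x_k}\!\bigl(-t_k\,\operatorname{grad} f(x_k)\bigr),\qquad t_k=\beta^{j_k},
\]
where $j_k$ is the smallest nonnegative integer such that
\[
f\bigl(\exp_{x_k}(-\beta^{j_k}\operatorname{grad} f(x_k))\bigr)\le f(x_k)-\sigma\,\beta^{j_k}\,\|\operatorname{grad} f(x_k)\|^{2}.
\]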
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue ...
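The paper's double-parameter formula is not reproduced here; for orientation only, the standard secant relation that any quasi-Newton Hessian approximation $B_{k+1}$ is required to satisfy, and a classical scaled BFGS update that does satisfy it, read
\[
B_{k+1}s_k=y_k,\qquad s_k=x_{k+1}-x_k,\quad y_k=\nabla f(x_{k+1})-\nabla f(x_k),
\]
\[
B_{k+1}=\theta_k\Bigl(B_k-\frac{B_k s_k s_k^{\top}B_k}{s_k^{\top}B_k s_k}\Bigr)+\frac{y_k y_k^{\top}}{y_k^{\top}s_k},\qquad \theta_k>0,
\]
which satisfies $B_{k+1}s_k=y_k$ for every scaling parameter $\theta_k$ and remains positive definite whenever $y_k^{\top}s_k>0$.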
Steepest descent method on a Riemannian manifold: the convex case
In this paper we are interested in the asymptotic behavior of the trajectories of the famous steepest descent evolution equation on Riemannian manifolds. It reads $\dot{x}(t)+\operatorname{grad}\varphi(x(t))=0$. It is shown how the convexity of the objective function $\varphi$ helps in establishing the convergence, as time goes to infinity, of the trajectories towards points that minimize $\varphi$. Some numerical illustrations are ...
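The mechanism behind such results is a one-line energy estimate (a standard computation, not specific to this paper): along any trajectory,
\[
\frac{d}{dt}\,\varphi(x(t))=\bigl\langle \operatorname{grad}\varphi(x(t)),\dot{x}(t)\bigr\rangle=-\bigl\|\operatorname{grad}\varphi(x(t))\bigr\|^{2}\le 0,
\]
so $\varphi$ is nonincreasing along the flow; geodesic convexity of $\varphi$ is then what upgrades this monotone decrease to convergence of $x(t)$ itself towards a minimizer as $t\to\infty$.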
Hybrid steepest-descent method with sequential and functional errors in Banach space
Let $X$ be a reflexive Banach space, $T:X\to X$ be a nonexpansive mapping with $C=\mathrm{Fix}(T)\neq\emptyset$, and $F:X\to X$ be $\delta$-strongly accretive and $\lambda$-strictly pseudocontractive with $\delta+\lambda>1$. In this paper, we present modified hybrid steepest-descent methods, involving sequential errors and functional errors with functions admitting a center, which generate convergent sequences ...
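For orientation, the classical hybrid steepest-descent iteration (Yamada's scheme in the Hilbert-space setting, stated here without the sequential and functional error terms this paper adds) is
\[
x_{n+1}=Tx_n-\lambda_n\,\mu\,F(Tx_n),\qquad n\ge 0,
\]
with step sizes $\lambda_n\to 0$, $\sum_n\lambda_n=\infty$ and a suitable constant $\mu>0$; under strong accretivity-type conditions on $F$, the iterates converge to the unique solution of a variational inequality over $\mathrm{Fix}(T)$.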